# Drug Discovery

## GP-MoLFormer-Uniq
Developer: ibm-research · License: Apache-2.0 · Tags: Molecular Model, Transformers · Downloads: 122 · Likes: 1

GP-MoLFormer is a chemical language model pretrained on 650 million to 1.1 billion molecular SMILES strings from ZINC and PubChem, focused on molecular generation tasks.
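
As a rough illustration of how such a generative chemical language model might be sampled with the Hugging Face `transformers` API, consider the sketch below. The repository id `ibm-research/GP-MoLFormer-Uniq`, the `trust_remote_code` flag, and the presence of a BOS token are assumptions, not details given in this listing.

```python
# Hypothetical sketch: de novo SMILES sampling with a causal chemical language model.
# The repository id, trust_remote_code, and BOS-token handling are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ibm-research/GP-MoLFormer-Uniq"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Start from the BOS token (assumed to be defined) and sample new molecules.
inputs = tokenizer(tokenizer.bos_token, return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_length=64, num_return_sequences=4)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```
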
## materials.selfies-ted
Developer: ibm-research · License: Apache-2.0 · Tags: Molecular Model, Transformers · Downloads: 3,343 · Likes: 7

A Transformer-based encoder-decoder model designed specifically for molecular representation using the SELFIES notation.
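
Because this model operates on SELFIES rather than SMILES, a quick round-trip with the open-source `selfies` Python package (separate from the model itself) shows what that representation looks like:

```python
# Minimal illustration of the SELFIES representation, using the `selfies` package.
import selfies as sf

smiles = "CC(=O)Oc1ccccc1C(=O)O"      # aspirin, written as SMILES
selfies_str = sf.encoder(smiles)      # convert SMILES -> SELFIES
roundtrip = sf.decoder(selfies_str)   # SELFIES -> SMILES (SELFIES strings are always decodable)

print(selfies_str)
print(roundtrip)
```
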
## MoLFormer-XL-both-10pct
Developer: ibm-research · License: Apache-2.0 · Tags: Molecular Model, Transformers · Downloads: 171.96k · Likes: 19

MoLFormer is a chemical language model pretrained on 1.1 billion molecular SMILES strings from ZINC and PubChem. This version is trained on a 10% sample of each dataset.
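
A minimal sketch of using MoLFormer as a molecular encoder with `transformers` follows; the repository id `ibm-research/MoLFormer-XL-both-10pct`, the `trust_remote_code` flag, and the mean-pooling step are assumptions made to illustrate the idea.

```python
# Sketch: extracting molecular embeddings with MoLFormer via transformers.
# Repository id, trust_remote_code, and the pooling choice are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "ibm-research/MoLFormer-XL-both-10pct"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]
inputs = tokenizer(smiles, padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into one vector per molecule (one possible choice).
embeddings = outputs.last_hidden_state.mean(dim=1)
print(embeddings.shape)
```
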
## MolGen-7b
Developer: zjunlp · License: Apache-2.0 · Tags: Molecular Model, Transformers · Downloads: 150 · Likes: 8

A large-scale molecular generation model based on the SELFIES molecular language, capable of de novo molecular generation or of completing partial molecular structures.
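
The sketch below illustrates completing a partial structure with a causal language model; the repository id `zjunlp/MolGen-7b`, the causal-LM head, and the SELFIES prompt are assumptions based on this description, not verified details.

```python
# Sketch: prompting MolGen-7b to complete a partial molecule.
# Repository id, model head, and the SELFIES fragment are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "zjunlp/MolGen-7b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "[C][C][O]"  # partial SELFIES fragment to be completed (illustrative)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, top_p=0.9, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
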
## WGAN Molecular Graphs
Developer: keras-io · Tags: Molecular Model · Downloads: 28 · Likes: 5

This model uses WGAN-GP and R-GCN architectures to generate novel molecular graph structures, accelerating the drug discovery process.
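
For context on the WGAN-GP part, the snippet below sketches the gradient-penalty term that keeps the critic approximately 1-Lipschitz, written with TensorFlow/Keras. It treats the molecular graph as a single dense tensor of shape (batch, nodes, features) for brevity, which is a simplification rather than the keras-io example verbatim.

```python
# Sketch of the WGAN-GP gradient penalty on interpolated graph tensors.
# The discriminator signature and tensor layout are illustrative assumptions.
import tensorflow as tf

def gradient_penalty(discriminator, real_graphs, fake_graphs):
    """Penalize the critic's gradient norm at points between real and fake graphs."""
    batch_size = tf.shape(real_graphs)[0]
    # Random interpolation coefficient per example, broadcast over nodes/features.
    alpha = tf.random.uniform([batch_size, 1, 1], 0.0, 1.0)
    interpolated = alpha * real_graphs + (1.0 - alpha) * fake_graphs

    with tf.GradientTape() as tape:
        tape.watch(interpolated)
        scores = discriminator(interpolated, training=True)

    grads = tape.gradient(scores, interpolated)
    norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2]) + 1e-12)
    # Penalty pushes the gradient norm toward 1.
    return tf.reduce_mean((norm - 1.0) ** 2)
```
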
## ChemGPT-1.2B
Developer: ncfrey · Tags: Molecular Model, Transformers · Downloads: 409 · Likes: 14

ChemGPT is a generative molecular model based on the GPT-Neo architecture, specializing in molecular generation for chemistry research.
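
A one-call sketch with the high-level `transformers` text-generation pipeline is shown below; the repository id `ncfrey/ChemGPT-1.2B` and the SELFIES-style prompt are assumptions drawn from this description.

```python
# Sketch: generating candidate molecules with ChemGPT via the pipeline API.
# Repository id and prompt format are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="ncfrey/ChemGPT-1.2B")
candidates = generator("[C][C]", max_new_tokens=40, do_sample=True, num_return_sequences=3)
for candidate in candidates:
    print(candidate["generated_text"])
```
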
## Pretrained SMILES PubChem10M
Developer: pchanda · Tags: Molecular Model, Transformers · Downloads: 509 · Likes: 1

A cheminformatics model pretrained on 10 million SMILES strings from the PubChem database, used primarily for molecular representation learning and chemical property prediction.
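
To illustrate the "representation learning plus property prediction" use case, the sketch below treats the pretrained encoder as a frozen featurizer for a toy scikit-learn classifier. The repository id `pchanda/pretrained-smiles-pubchem10m`, the mean pooling, and the labels are illustrative assumptions.

```python
# Sketch: frozen SMILES embeddings feeding a simple property classifier.
# Repository id, pooling choice, and the toy labels are assumptions.
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

repo_id = "pchanda/pretrained-smiles-pubchem10m"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

smiles = ["CCO", "CCN", "c1ccccc1", "CC(=O)O"]
labels = [0, 0, 1, 0]  # toy binary property labels (illustrative)

inputs = tokenizer(smiles, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state
features = hidden.mean(dim=1).numpy()  # mean-pooled molecule embeddings

clf = LogisticRegression().fit(features, labels)
print(clf.predict(features))
```
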